Low Rank SVM

Authors

  • Hueihan Jhuang
  • Lior Wolf
Abstract

Recently, some research has tried to incorporate the 2D structure of images into the dimensionality reduction process, for example 2DPCA [3] and CSA [4]. Other work uses high-order tensors to represent image ensembles, where the factors may include different faces, facial expressions, viewpoints, and illuminations [5, 6]. All of these efforts indeed provide good performance, but they tend to separate the dimensionality reduction stage from the supervised learning stage.
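
As a concrete illustration of such a two-stage pipeline (a toy sketch, not the method of this paper), the code below performs a 2DPCA-style projection that keeps the 2D image structure and then trains a linear SVM on the projected features in a separate step. The image sizes, random data, number of components, and the use of scikit-learn's LinearSVC are all illustrative assumptions.

```python
# Toy sketch of the two-stage pipeline: 2DPCA-style reduction, then a separate SVM.
import numpy as np
from sklearn.svm import LinearSVC

def fit_2dpca(images, n_components):
    """images: (N, m, n) stack of image matrices; returns an (n, d) projection."""
    centered = images - images.mean(axis=0)
    # Image scatter matrix (n x n), accumulated over the rows of the centered images.
    G = np.einsum('ikj,ikl->jl', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)
    top = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, top]

rng = np.random.default_rng(0)
X_img = rng.normal(size=(200, 32, 32))          # 200 toy 32x32 "images"
y = rng.integers(0, 2, size=200)                # toy binary labels

proj = fit_2dpca(X_img, n_components=8)         # stage 1: unsupervised reduction
feat = (X_img @ proj).reshape(len(X_img), -1)   # project each image, then flatten
clf = LinearSVC(dual=False).fit(feat, y)        # stage 2: supervised learning, trained separately
print(clf.score(feat, y))
```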

Similar works

Low-rank Audio Signal Classification Under Soft Margin and Trace Norm Constraints

We propose an algorithm for speech/non-speech classification based on a low-rank matrix representation of audio data. Conventionally, such matrix data can be represented by a vector in a high-dimensional space. Some learning algorithms are then applied in that vector space for matrix data classification. In particular, maximum-margin classifiers such as the support vector machine (SVM) ha...
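
The conventional route described above can be sketched as follows: matrix-shaped audio features are flattened into high-dimensional vectors and handed to a maximum-margin classifier. The spectrogram shapes and random toy data are assumptions, and scikit-learn's LinearSVC stands in for a generic SVM.

```python
# Conventional vector-space route: flatten matrix-shaped features, train a max-margin classifier.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
specs = rng.normal(size=(300, 40, 100))   # 300 clips, 40 bands x 100 frames (toy data)
labels = rng.integers(0, 2, size=300)     # 1 = speech, 0 = non-speech (toy labels)

X = specs.reshape(len(specs), -1)         # each 40x100 matrix becomes a 4000-dim vector
clf = LinearSVC(dual=False, C=1.0).fit(X, labels)
print(clf.score(X, labels))
```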

Scaling up Kernel SVM on Limited Resources: A Low-rank Linearization Approach

Kernel Support Vector Machines deliver state-of-the-art results in non-linear classification, but the need to maintain a large number of support vectors poses a challenge in large-scale training and testing. In contrast, the linear SVM is much more scalable even on limited computing resources (e.g., everyday PCs), but the learned model cannot capture non-linear concepts. To scale up kernel SVM on ...
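
One common low-rank linearization, not necessarily the one used in that paper, is the Nystroem approximation: a low-rank feature map approximates the kernel, after which a scalable linear SVM is trained. The sketch below uses scikit-learn's Nystroem transformer on toy data; the rank, kernel parameters, and data are assumed for illustration.

```python
# Low-rank linearization via the Nystroem method, followed by a linear SVM.
import numpy as np
from sklearn.kernel_approximation import Nystroem
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))
y = (np.sin(X[:, 0]) + X[:, 1] ** 2 > 1).astype(int)   # a non-linear concept

model = make_pipeline(
    Nystroem(kernel='rbf', gamma=0.5, n_components=200, random_state=0),  # rank-200 feature map
    LinearSVC(dual=False),                                                # scalable linear SVM
)
model.fit(X, y)
print(model.score(X, y))
```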

Reduction Techniques for Training Support Vector Machines

Recently, two kinds of reduction techniques aimed at saving training time for SVM problems with nonlinear kernels were proposed. Instead of solving the standard SVM formulation, these methods explicitly alter the SVM formulation, and their solutions are used to classify data. The first approach, the reduced support vector machine (RSVM) [21], preselects a subset of the data as support vectors a...
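
A rough sketch of the RSVM idea of preselecting a subset as candidate support vectors: restrict the kernel expansion to a random subset, form the rectangular kernel K(X, X_subset), and fit a linear classifier on those columns. RSVM itself solves a smoothed SVM formulation; the plain LinearSVC, the subset size, and the kernel parameters below are stand-in assumptions.

```python
# RSVM-flavoured reduction: preselect a random subset as candidate support vectors.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 10))
y = (np.linalg.norm(X, axis=1) > 3).astype(int)

subset = rng.choice(len(X), size=150, replace=False)   # the preselected subset
K = rbf_kernel(X, X[subset], gamma=0.1)                # rectangular 3000 x 150 kernel
clf = LinearSVC(dual=False).fit(K, y)                  # linear classifier on reduced-kernel columns
print(clf.score(K, y))
```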

Efficient Multiple Kernel Learning Algorithms Using Low-Rank Representation

Unlike the Support Vector Machine (SVM), Multiple Kernel Learning (MKL) gives a dataset the freedom to choose useful kernels based on its distribution characteristics rather than a single fixed one. It has been shown in the literature that MKL achieves superior recognition accuracy compared with SVM, although at the expense of time-consuming computations. This creates analytical and computational diff...
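
As a hedged point of reference, the sketch below builds only the fixed, uniform-weight combination of candidate kernels that MKL generalizes; a real MKL solver would learn the kernel weights jointly with the classifier. The candidate kernels, their parameters, and the toy data are illustrative assumptions.

```python
# Uniform-weight combination of candidate kernels (the baseline that MKL generalizes).
import numpy as np
from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 15))
y = rng.integers(0, 2, size=400)

candidates = [
    rbf_kernel(X, gamma=0.1),
    rbf_kernel(X, gamma=1.0),
    polynomial_kernel(X, degree=2),
    linear_kernel(X),
]
K = sum(candidates) / len(candidates)      # fixed uniform weights; MKL would learn them
clf = SVC(kernel='precomputed').fit(K, y)
print(clf.score(K, y))
```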

Cost-Sensitive Learning of SVM for Ranking

In this paper, we propose a new method for learning to rank. ‘Ranking SVM’ is a method for performing this task. It formalizes the problem as binary classification on instance pairs and performs the classification by means of Support Vector Machines (SVM). In Ranking SVM, the losses for incorrect classifications of instance pairs between different rank pairs are defined to be the same. We n...
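
The pairwise recipe described above can be sketched as follows: rank preferences become binary classification on difference vectors, and a per-pair sample weight lets the loss differ across rank pairs, which is the cost-sensitive twist. The weighting scheme, relevance grades, and toy data here are assumptions, not the paper's exact formulation.

```python
# Ranking SVM sketch: pairwise differences as a binary problem, with per-pair costs.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # feature vectors of the instances
grade = rng.integers(0, 3, size=100)     # toy relevance grades 0..2

diffs, labels, weights = [], [], []
for i in range(len(X)):
    for j in range(len(X)):
        if grade[i] > grade[j]:
            sign = 1 if (i + j) % 2 == 0 else -1        # alternate signs to balance classes
            diffs.append(sign * (X[i] - X[j]))
            labels.append(sign)
            weights.append(float(grade[i] - grade[j]))  # larger cost for wider rank gaps

ranker = LinearSVC(dual=False).fit(np.array(diffs), labels,
                                   sample_weight=np.array(weights))
scores = X @ ranker.coef_.ravel()        # higher score means ranked higher
print(scores[:5])
```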

Cost-Sensitive Support Vector Ranking for Information Retrieval

In recent years, learning-to-rank algorithms have been proposed by researchers. However, in information retrieval, the instances of different ranks are imbalanced. After the instances are composed into pairs, the rank pairs are imbalanced too. In this paper, a cost-sensitive risk-minimization model for pairwise learning to rank on imbalanced data sets is proposed. Following this model, the algorithm...

Journal title:

Volume   Issue

Pages  -

Publication date: 2006